Search results for "Computer Science - Computational Engineering"
Showing 10 of 13 documents
Nanoscale ear drum: graphene based nanoscale sensors.
2012
The difficulty in determining the mass of a sample increases as its size diminishes. At the nanoscale, there are no direct methods for resolving the mass of single molecules or nanoparticles, so more sophisticated approaches based on electromechanical phenomena are required. Moreover, such nanoelectromechanical techniques should provide information not only about the mass of the target molecules but also about their geometrical properties. In this spirit, we report a theoretical study that illustrates in detail how graphene membranes can operate as nanoelectromechanical mass-sensor devices. Wide graphene sheets were exposed to different types and amounts of molecul…
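The standard operating principle behind such resonant mass sensors can be sketched numerically: an adsorbed mass lowers the resonance frequency of the membrane, and for small loads the mass follows from the frequency shift. A minimal sketch, with purely illustrative numbers (the paper's own device parameters are not reproduced here):

```python
def added_mass(m_eff, f0, f_shifted):
    """Estimate adsorbed mass from a resonance-frequency downshift.

    Uses the small-shift approximation dm ~ -2 * m_eff * df / f0,
    which follows from f ~ sqrt(k / m) when dm << m_eff.
    """
    df = f_shifted - f0
    return -2.0 * m_eff * df / f0

# Hypothetical numbers for a graphene drum resonator (illustrative only).
m_eff = 1.0e-18   # effective membrane mass, kg
f0 = 100.0e6      # bare resonance frequency, Hz
f1 = 99.9e6       # downshifted resonance after adsorption, Hz
dm = added_mass(m_eff, f0, f1)   # ~2e-21 kg of adsorbed material
```

A downshift of 0.1% in frequency thus corresponds to an added mass three orders of magnitude below the effective membrane mass, which is why resonant readout is attractive at this scale.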
Improving computation efficiency using input and architecture features for a virtual screening application
2023
Virtual screening is an early stage of the drug discovery process that selects the most promising candidates. In an urgent-computing scenario, it is critical to find a solution in a short time frame. In this paper, we focus on a real-world virtual screening application to evaluate out-of-kernel optimizations that consider input and architecture features to improve computation efficiency on the GPU. Experimental results on a modern supercomputer node show that we can almost double the performance. Moreover, we implemented the optimization in SYCL, and it provides a benefit consistent with the CUDA optimization. A virtual screening campaign can use this gain in performance to increase the nu…
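One common input-aware, out-of-kernel optimization of this kind is to batch ligands by size so each GPU batch has near-uniform work per thread. The sketch below illustrates only that batching idea in plain Python; the field names and bucket width are hypothetical, not taken from the paper:

```python
from collections import defaultdict

def batch_by_size(ligands, bucket_width=8):
    """Group ligands into buckets of similar atom count, so that a GPU
    kernel launched per batch sees near-uniform work per thread.
    Field names ('id', 'atoms') and bucket_width are illustrative."""
    buckets = defaultdict(list)
    for lig in ligands:
        buckets[lig["atoms"] // bucket_width].append(lig)
    # Return batches ordered by size class.
    return [buckets[k] for k in sorted(buckets)]

ligands = [{"id": i, "atoms": a} for i, a in enumerate([12, 45, 13, 44, 80])]
batches = batch_by_size(ligands)
# Ligands 0 and 2 (12 and 13 atoms) land in the same small-molecule batch.
```

The point of the preprocessing is that it happens outside the kernel: the device code is unchanged, but divergence and load imbalance inside each launch are reduced.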
Seam Puckering Objective Evaluation Method for Sewing Process
2015
The paper presents an automated method for the assessment and classification of puckering defects detected during the preproduction control stage of sewing machines or during product inspection. In this respect, we present the possible causes of and remedies for wrinkle nonconformities. Subjective factors related to the control environment and operators during seam evaluation can be reduced using an automated system based on image processing. Our implementation involves spectral image analysis using the Fourier transform and an unsupervised neural network, the Kohonen map, employed to classify material specimens (the input images) into five discrete degrees of quality…
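The Fourier stage of such a pipeline can be sketched as a radially averaged power spectrum: periodic fabric texture and wrinkle patterns concentrate energy in different frequency bands, and the band averages form the feature vector fed to the classifier. A minimal numpy sketch of that feature-extraction step only (the Kohonen-map classification stage is omitted, and the band count is an assumption):

```python
import numpy as np

def spectral_features(img, n_bands=5):
    """Radially averaged power-spectrum features of a grayscale image.
    A stand-in for the paper's spectral-analysis stage; n_bands is
    illustrative, not the paper's choice."""
    F = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(F) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h / 2, x - w / 2)      # radial frequency per pixel
    r_max = r.max()
    feats = []
    for i in range(n_bands):
        # Average power over one annular frequency band.
        mask = (r >= i * r_max / n_bands) & (r < (i + 1) * r_max / n_bands)
        feats.append(power[mask].mean())
    return np.array(feats)

rng = np.random.default_rng(0)
img = rng.random((64, 64))      # synthetic "fabric" image
f = spectral_features(img)      # low band dominated by the DC term
```

Each specimen image is thus reduced to a short, rotation-insensitive vector before clustering, which is what makes an unsupervised map practical on this data.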
Reduced Order Models for Pricing European and American Options under Stochastic Volatility and Jump-Diffusion Models
2016
European options can be priced by solving parabolic partial (integro-)differential equations under stochastic volatility and jump-diffusion models such as the Heston, Merton, and Bates models. American option prices can be obtained by solving linear complementarity problems (LCPs) with the same operators. A finite difference discretization leads to a so-called full order model (FOM). Reduced order models (ROMs) are derived employing proper orthogonal decomposition (POD). The early exercise constraint of American options is enforced by a penalty on a subset of grid points. The presented numerical experiments demonstrate that pricing with ROMs can be orders of magnitude faster within a given model p…
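The POD step itself is a small computation: collect FOM solution vectors as snapshot columns, take an SVD, and keep the leading left singular vectors as the reduced basis. A minimal sketch with synthetic snapshots (Gaussian bumps standing in for price surfaces; the real snapshots would come from the finite difference FOM):

```python
import numpy as np

# Snapshot matrix: each column is one (synthetic) FOM solution vector.
x = np.linspace(0.0, 1.0, 200)
snapshots = np.column_stack([np.exp(-((x - m) ** 2) / 0.02)
                             for m in np.linspace(0.3, 0.7, 20)])

# POD: the left singular vectors give an energy-optimal reduced basis.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 5                      # reduced dimension (illustrative)
basis = U[:, :r]           # (200, r) orthonormal POD basis

# Project an unseen solution into the ROM space and reconstruct it.
new = np.exp(-((x - 0.55) ** 2) / 0.02)
coeffs = basis.T @ new                 # r reduced coordinates
recon = basis @ coeffs                 # lift back to the full grid
err = np.linalg.norm(recon - new) / np.linalg.norm(new)
```

The ROM speedup in the paper comes from solving in the r-dimensional coordinate space instead of on the full grid; the sketch shows only why a few modes can represent the solution family accurately.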
Modeling Business
2003
Business concepts are studied with a metamodel-based approach using UML 2.0. The Notation Independent Business concepts metamodel is introduced. The approach offers a mapping between different business modeling notations, which could be used for bridging BM tools and boosting the MDA approach.
Efficient formulation of a two-noded geometrically exact curved beam element
2021
The article extends the formulation of a 2D geometrically exact beam element proposed by Jirasek et al. (2021) to curved elastic beams. This formulation is based on equilibrium equations in their integrated form, combined with kinematic relations and sectional equations that link the internal forces to sectional deformation variables. The resulting first-order differential equations are approximated by a finite difference scheme, and the boundary value problem is converted to an initial value problem using the shooting method. The article develops the theoretical framework based on the Navier-Bernoulli hypothesis, with a possible extension to shear-flexible beams. Numerical procedures …
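The shooting idea mentioned above is generic: guess the unknown initial slope, integrate the ODE forward, and adjust the guess until the far boundary condition is met. A minimal sketch on a toy problem (y'' = -y with y(0) = 0, y(pi/2) = 1, whose exact solution is y = sin x; this is not the beam system from the paper):

```python
import math

def integrate(slope0, n=1000):
    """RK4 integration of y'' = -y on [0, pi/2] with y(0)=0, y'(0)=slope0.
    Returns y(pi/2)."""
    h = (math.pi / 2) / n
    y, v = 0.0, slope0
    for _ in range(n):
        # Classic RK4 for the first-order system y' = v, v' = -y.
        k1y, k1v = v, -y
        k2y, k2v = v + h/2*k1v, -(y + h/2*k1y)
        k3y, k3v = v + h/2*k2v, -(y + h/2*k2y)
        k4y, k4v = v + h*k3v, -(y + h*k3y)
        y += h/6*(k1y + 2*k2y + 2*k3y + k4y)
        v += h/6*(k1v + 2*k2v + 2*k3v + k4v)
    return y

# Shooting: secant iteration on the initial slope so that y(pi/2) = 1.
s0, s1 = 0.0, 2.0
f0, f1 = integrate(s0) - 1.0, integrate(s1) - 1.0
for _ in range(20):
    if abs(f1) < 1e-10 or abs(f1 - f0) < 1e-14:
        break
    s2 = s1 - f1 * (s1 - s0) / (f1 - f0)
    s0, f0 = s1, f1
    s1, f1 = s2, integrate(s2) - 1.0
slope = s1    # converges to 1.0, matching y = sin x
```

Because this toy ODE is linear, the secant iteration lands on the answer in one correction; for the nonlinear beam equations the same loop simply takes a few more iterations.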
Group Importance Sampling for particle filtering and MCMC
2018
Bayesian methods, and their implementation by means of sophisticated Monte Carlo techniques, have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
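The compression idea can be sketched in a few lines: a population of weighted samples is summarized by one particle whose weight is the average of the group's weights (an estimate of the normalizing constant) and whose location is resampled in proportion to the weights. This is only an illustration of the underlying idea, not the paper's full GIS scheme:

```python
import numpy as np

rng = np.random.default_rng(2)

# Target: standard normal. Proposal: N(0, 3^2). Plain IS weights
# w = pi(x) / q(x); the shared 0.5*log(2*pi) terms cancel in the logs.
N = 5000
samples = rng.normal(0.0, 3.0, N)
log_w = (-0.5 * samples**2) - (-0.5 * (samples / 3.0)**2 - np.log(3.0))
w = np.exp(log_w)

# Compress the whole group into ONE weighted sample:
#  - the group weight is the average of the individual weights
#    (it estimates the ratio of normalizing constants, here 1);
#  - the group location is drawn by resampling proportional to w.
group_weight = w.sum() / N
idx = rng.choice(N, p=w / w.sum())
group_sample = samples[idx]

# Ordinary self-normalized IS estimate of E[x] (should be near 0).
posterior_mean = float(w @ samples / w.sum())
```

Downstream algorithms (particle filters, MCMC-within-IS schemes) can then treat `(group_sample, group_weight)` as a single particle instead of carrying the full population, which is the memory and communication saving the paper analyzes.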
Semantics of UML 2.0 Activity Diagram for Business Modeling by Means of Virtual Machine
2005
The paper proposes a more formalized definition of UML 2.0 Activity Diagram semantics. A subset of activity diagram constructs relevant to business process modeling is considered. The semantics definition is based on the original token-flow methodology, but a more constructive approach is used. The Activity Diagram Virtual Machine is defined by means of a metamodel, with operations defined by a mix of pseudocode and OCL pre- and postconditions. A formal procedure is described which builds the virtual machine for any activity diagram. The relatively complicated original token-movement rules for control nodes and edges are combined into paths from one action to another. A new approach is the us…
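The token-flow execution style can be illustrated with a toy interpreter that moves a single control token along action-to-action paths, which is the simplification the paper's path construction enables. This is a drastically reduced stand-in (no decision, fork, or join nodes, and none of the paper's metamodel or OCL machinery):

```python
class Action:
    """One activity-diagram action with a single outgoing control edge."""
    def __init__(self, name):
        self.name = name
        self.outgoing = None   # next Action along the control-flow path

def run(initial):
    """Move one control token from action to action, recording the
    execution trace, until an action with no outgoing edge is reached."""
    trace, node = [], initial
    while node is not None:
        trace.append(node.name)    # "execute" the action
        node = node.outgoing       # token travels along the edge
    return trace

# A linear three-action process: receive -> process -> reply.
a, b, c = Action("receive"), Action("process"), Action("reply")
a.outgoing, b.outgoing = b, c
trace = run(a)   # ['receive', 'process', 'reply']
```

Collapsing the per-node, per-edge token rules into whole paths, as the abstract describes, means the virtual machine only ever performs this kind of action-to-action hop.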
A Two-Stage Reconstruction of Microstructures with Arbitrarily Shaped Inclusions
2020
The main goal of our research is to develop an effective method with a wide range of applications for the statistical reconstruction of heterogeneous microstructures with compact inclusions of any shape, such as highly irregular grains. The devised approach uses multi-scale extended entropic descriptors (ED) that quantify the degree of spatial non-uniformity of configurations of finite-sized objects. This technique is an innovative development of previously elaborated entropy methods for statistical reconstruction. Here, we discuss the two-dimensional case, but the method can be generalized to three dimensions. At the first stage, the developed procedure creates a set of black synthetic …
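The general shape of such statistical reconstruction is a pixel-swap loop that drives a descriptor of a random image toward the target's descriptor value. The sketch below uses a crude two-point statistic in place of the paper's entropic descriptors, and a greedy acceptance rule in place of annealing, purely to show the loop structure:

```python
import numpy as np

rng = np.random.default_rng(4)

def row_pair_stat(img):
    """Toy descriptor: fraction of horizontally adjacent pixel pairs
    that are both black. A stand-in for the paper's entropic
    descriptors, chosen only for simplicity."""
    return (img[:, :-1] & img[:, 1:]).mean()

# Target microstructure (one compact inclusion) and a random initial
# guess with the same black-pixel count; swaps preserve volume fraction.
target = np.zeros((32, 32), dtype=int)
target[8:24, 8:24] = 1
goal = row_pair_stat(target)
recon = rng.permutation(target.ravel()).reshape(target.shape)

# Greedy swap loop: accept swaps that move the descriptor toward the
# target value (true annealing would also accept some uphill moves).
err = abs(row_pair_stat(recon) - goal)
for _ in range(20000):
    (i1, j1), (i2, j2) = rng.integers(0, 32, (2, 2))
    if recon[i1, j1] == recon[i2, j2]:
        continue                     # swapping equal pixels is a no-op
    recon[i1, j1], recon[i2, j2] = recon[i2, j2], recon[i1, j1]
    new_err = abs(row_pair_stat(recon) - goal)
    if new_err <= err:
        err = new_err
    else:                            # revert a worsening swap
        recon[i1, j1], recon[i2, j2] = recon[i2, j2], recon[i1, j1]
```

Matching a single scalar statistic clumps the black phase but cannot recover inclusion shape; the paper's multi-scale descriptors constrain many scales at once, which is what makes arbitrary inclusion shapes reconstructable.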
Compressed Particle Methods for Expensive Models With Application in Astronomy and Remote Sensing
2021
In many inference problems, the evaluation of complex and costly models is often required. In this context, Bayesian methods have become very popular in several fields in recent years, for parameter inversion, model selection, or uncertainty quantification. Bayesian inference requires the approximation of complicated integrals involving (often costly) posterior distributions. Generally, this approximation is obtained by means of Monte Carlo (MC) methods. In order to reduce the computational cost of the corresponding technique, surrogate models (also called emulators) are often employed. Another alternative approach is the so-called Approximate Bayesian Computation (ABC) sc…
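The ABC idea mentioned at the end can be sketched with rejection sampling: draw a parameter from the prior, simulate the model, and keep the draw only if the simulated summary lands within a tolerance of the observed one, so no likelihood is ever evaluated. A minimal sketch with a cheap stand-in model (all numbers and the summary statistic are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

def model(theta, n=50):
    """Forward model stand-in: simulate n noisy observations around
    theta and return their mean as the summary statistic. In the
    paper's setting this call would be the expensive part."""
    return rng.normal(theta, 1.0, n).mean()

observed = 2.0     # observed summary statistic (synthetic)
eps = 0.1          # ABC tolerance: smaller eps = closer to true posterior

# ABC rejection: keep prior draws whose simulation matches the data.
accepted = []
for _ in range(20000):
    theta = rng.uniform(-5.0, 5.0)       # draw from a uniform prior
    if abs(model(theta) - observed) < eps:
        accepted.append(theta)
posterior_mean = float(np.mean(accepted))   # concentrates near 2.0
```

Each accepted draw costs many rejected model runs, which is exactly why the paper pairs such likelihood-free schemes with surrogates and with compression of the particle population.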